
    On the parametrization of clapping

    For a Reactive Virtual Trainer (RVT), subtle timing and lifelikeness of motion are of primary importance. To allow for reactivity, movement adaptation, such as a change of tempo, is necessary. In this paper we investigate the relation between movement tempo, its synchronization to verbal counting, time distribution, amplitude, and left-right symmetry of a clapping movement. We analyze motion capture data of two subjects performing a clapping exercise, both freely and timed by a metronome. Our findings are compared to existing gesture research and biomechanical models. We found that, for our subjects, verbal counting adheres to the phonological synchrony rule. A linear relationship between the movement path length and the tempo was found. The symmetry between the left and the right hand can be described by the biomechanical model of two coupled oscillators.
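    As a rough illustration of the coupled-oscillator idea mentioned above, the sketch below simulates two phase oscillators (one per hand) with a symmetric coupling that pulls them toward in-phase motion. This is a generic, minimal model assumed for illustration; the abstract does not specify the exact coupling form or parameters used in the paper.

        # Sketch: two phase oscillators (left/right hand) with symmetric coupling.
        # Illustrative only; not the exact model or parameters from the paper.
        import numpy as np

        def simulate(tempo_hz=2.0, coupling=1.5, dt=0.01, steps=2000):
            omega = 2 * np.pi * tempo_hz          # preferred angular frequency (rad/s)
            phase = np.array([0.0, 0.4])          # hands start slightly out of phase
            history = []
            for _ in range(steps):
                dphi = phase[0] - phase[1]
                # each oscillator is pulled toward in-phase motion with the other
                phase[0] += dt * (omega - coupling * np.sin(dphi))
                phase[1] += dt * (omega + coupling * np.sin(dphi))
                history.append(dphi)
            return np.array(history)

        relative_phase = simulate()
        print("final relative phase:", relative_phase[-1])  # tends toward 0 (symmetric clapping)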

    A Demonstration of Continuous Interaction with Elckerlyc

    We discuss behavior planning in the style of the SAIBA framework for continuous (as opposed to turn-based) interaction. Such interaction requires the real-time application of minor shape or timing modifications of running behavior and anticipation of behavior of a (human) interaction partner. We discuss how behavior (re)planning and on-the-fly parameter modification fit into the current SAIBA framework, and what type of language or architecture extensions might be necessary. Our BML realizer Elckerlyc provides flexible mechanisms for both the specification and the execution of modifications to running behavior. We show how these mechanisms are used in a virtual trainer and two turn-taking scenarios.
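    To illustrate the kind of on-the-fly timing modification described above, the sketch below stretches the remaining sync points of a behavior that is already running. It is a generic illustration under assumed names (RunningBehavior, retime); it is not Elckerlyc's actual API.

        # Hypothetical sketch of retiming a running behavior (not Elckerlyc's real API).
        from dataclasses import dataclass, field

        @dataclass
        class RunningBehavior:
            behavior_id: str
            sync_points: dict = field(default_factory=dict)   # name -> absolute time (s)

            def retime(self, now: float, stretch: float):
                """Stretch all sync points that are still in the future by `stretch`."""
                for name, t in self.sync_points.items():
                    if t > now:
                        self.sync_points[name] = now + (t - now) * stretch

        nod = RunningBehavior("nod1", {"start": 0.0, "stroke": 0.4, "end": 0.8})
        nod.retime(now=0.2, stretch=1.5)    # slow down the remainder of the gesture
        print(nod.sync_points)              # 'stroke' and 'end' are pushed later in time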

    Establishing Rapport with a Virtual Dancer

    We discuss an embodied agent that acts as a dancer and invites human partners to dance with her. The dancer has a repertoire of gestures and moves, obtained from inverse kinematics and motion capture, that can be combined in order to dance both to the beat of the music provided to the dancer and to sensor input (visual and dance pad) from a human partner made available to the virtual dancer. The interaction between the virtual dancer and the human dancer allows alternating ‘lead’ and ‘follow’ behavior, from the point of view of both the virtual and the human dancer.
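    A minimal sketch of how such beat-aligned move selection with alternating lead/follow roles could look is given below. The move names, the alternation rule, and the sensor stand-in are illustrative assumptions, not details taken from the paper.

        # Illustrative move selection on the beat, alternating lead/follow roles
        # (moves and alternation rule are hypothetical, not from the paper).
        import random
        from typing import Optional

        MOVES = ["step_left", "step_right", "arm_wave", "spin"]

        def next_move(beat_index: int, observed_move: Optional[str], leading: bool) -> str:
            if leading or observed_move is None:
                # lead: pick our own move from the repertoire, aligned to the beat
                return MOVES[beat_index % len(MOVES)]
            # follow: mirror the move the sensors attribute to the human dancer
            return observed_move

        for beat in range(8):
            observed = random.choice(MOVES)        # stand-in for visual/dance-pad input
            leading = (beat // 4) % 2 == 0         # alternate lead/follow every four beats
            print(beat, "lead" if leading else "follow", next_move(beat, observed, leading))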

    Measuring Behavior using Motion Capture

    Motion capture systems, using optical, magnetic or mechanical sensors, are now widely used to record human motion. Motion capture provides us with precise measurements of human motion at a very high recording frequency and accuracy, resulting in a massive amount of movement data on several joints of the body or markers of the face. But how do we make sure that we record the right things? And how can we correctly interpret the recorded data? In this multi-disciplinary symposium, speakers from the fields of biomechanics, computer animation, human-computer interaction and behavioral science come together to discuss their methods both to record motion and to extract useful properties from the data. In these fields, the construction of human movement models from motion capture data is the focal point, although the application of such models differs per field. Such models can be used to generate and evaluate highly adaptable and believable animation of virtual characters in computer animation, to explore the details of gesture interaction in human-computer interaction applications, to identify patterns related to affective states, or to find biomechanical properties of human movement.

    Leading and following with a virtual trainer

    This paper describes experiments with a virtual fitness trainer capable of mutually coordinated interaction. The virtual human co-exercises along with the user, leading as well as following in tempo, to motivate the user and to influence the speed with which the user performs the exercises. In a series of three experiments (20 participants in total) we attempted to influence the users' performance by manipulating the (timing of the) exercise behavior of the virtual trainer. The results show that it is possible to do this implicitly, using only micro-adjustments to its bodily behavior. As such, the system is a first step in the direction of mutually coordinated bodily interaction for virtual humans.
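    One plausible way to realise such implicit tempo influence is sketched below: the trainer stays close to the user's observed tempo but nudges its own tempo toward a target, and the user partially entrains to the trainer. The gains and the simple entrainment model are assumptions for illustration, not the paper's parameters or method.

        # Illustrative tempo adaptation: the trainer exercises slightly faster (or slower)
        # than the user in order to pull the user toward a target tempo.
        # The gains below are illustrative assumptions, not the paper's values.

        def trainer_tempo(user_tempo: float, target_tempo: float, lead_gain: float = 0.3) -> float:
            """Stay close to the user's tempo, but nudge toward the target."""
            return user_tempo + lead_gain * (target_tempo - user_tempo)

        user = 40.0                    # user's current tempo (repetitions per minute)
        target = 50.0                  # tempo we want the user to reach
        for _ in range(5):
            trainer = trainer_tempo(user, target)
            user += 0.5 * (trainer - user)   # assumed partial entrainment of the user
            print(f"trainer {trainer:.1f}  user {user:.1f}")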

    An Animation Framework for Continuous Interaction with Reactive Virtual Humans

    We present a complete framework for animation of Reactive Virtual Humans that offers a mixed animation paradigm: control of different body parts switches between keyframe animation, procedural animation and physical simulation, depending on the requirements of the moment. This framework implements novel techniques to support real-time continuous interaction. It is demonstrated on our interactive Virtual Conductor.
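    The sketch below illustrates the general shape of such a mixed animation paradigm: each body part is driven by whichever controller is currently assigned to it, and assignments can switch at runtime. The class and method names are hypothetical and do not reflect the framework's actual design beyond what the abstract states.

        # Schematic sketch of per-body-part controller switching
        # (hypothetical interfaces, not the framework's actual classes).
        from abc import ABC, abstractmethod

        class Controller(ABC):
            @abstractmethod
            def pose(self, part: str, t: float) -> dict: ...

        class KeyframeController(Controller):
            def pose(self, part, t):
                return {part: f"keyframe pose at t={t:.2f}"}

        class PhysicsController(Controller):
            def pose(self, part, t):
                return {part: f"simulated pose at t={t:.2f}"}

        class MixedAnimator:
            def __init__(self):
                self.assignment = {}                  # body part -> controller

            def assign(self, part: str, controller: Controller):
                self.assignment[part] = controller    # may be switched at any frame

            def frame(self, t: float) -> dict:
                out = {}
                for part, ctrl in self.assignment.items():
                    out.update(ctrl.pose(part, t))
                return out

        animator = MixedAnimator()
        animator.assign("arms", KeyframeController())
        animator.assign("legs", PhysicsController())
        print(animator.frame(0.5))
        animator.assign("arms", PhysicsController())  # switch paradigm on the fly
        print(animator.frame(1.0))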

    Elckerlyc in practice - on the integration of a BML Realizer in real applications

    Building a complete virtual human application from scratch is a daunting task, and it makes sense to rely on existing platforms for behavior generation. When building such an interactive application, one needs to be able to adapt and extend the capabilities of the virtual human offered by the platform, without having to make invasive modifications to the platform itself. This paper describes how Elckerlyc, a novel platform for controlling a virtual human, offers these possibilities.